From Dual to Primal Sub-optimality for Regularized Empirical Risk Minimization

Author

  • Ching-pei Lee
Abstract

Regularized empirical risk minimization problems are fundamental tasks in machine learning and data analysis. Many successful approaches for solving these problems are based on a dual formulation, which often admits more efficient algorithms. Often, though, the primal solution is needed. In the case of regularized empirical risk minimization, there is a convenient formula for reconstructing an approximate primal solution from the approximate dual solution. However, the question of quantifying the sub-optimality of the primal solution so obtained, and how it relates to the sub-optimality of the approximate dual solution, has not been well studied. This paper presents two results. First, we show that when the primal has a Lipschitz continuous gradient, we can recover an O(ε)-sub-optimal primal solution from an O(ε)-sub-optimal dual solution. Second, when the primal is Lipschitz continuous, we can recover an O(√ε)-sub-optimal primal solution from an O(ε)-sub-optimal dual solution. These results imply that an algorithm that is linearly convergent for the dual problem can yield primal iterates that converge R-linearly to the solution of the primal.
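
For context, the "convenient formula" mentioned above is usually stated for a setup of the following form; the display below is a standard illustration written for the L2 regularizer (with losses ξ_i and their convex conjugates ξ_i*), not necessarily the paper's most general setting:

    % Regularized ERM primal, its dual, and the primal-recovery formula
    % (illustrative; shown for the regularizer (lambda/2)||w||^2).
    \begin{align*}
    \text{(Primal)}\quad   & \min_{w}\; P(w) = \frac{1}{n}\sum_{i=1}^{n} \xi_i(x_i^\top w) + \frac{\lambda}{2}\|w\|^2, \\
    \text{(Dual)}\quad     & \max_{\alpha}\; D(\alpha) = -\frac{1}{n}\sum_{i=1}^{n} \xi_i^*(-\alpha_i) - \frac{\lambda}{2}\Big\|\frac{1}{\lambda n}\sum_{i=1}^{n}\alpha_i x_i\Big\|^2, \\
    \text{(Recovery)}\quad & w(\alpha) = \frac{1}{\lambda n}\sum_{i=1}^{n}\alpha_i x_i .
    \end{align*}

In this notation, the two results say that if D(α*) − D(α) ≤ ε, then P(w(α)) − P(w*) is O(ε) when P has a Lipschitz continuous gradient, and O(√ε) when P is Lipschitz continuous.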

Similar papers

Doubly Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization with Factorized Data

We propose a doubly stochastic primal-dual coordinate optimization algorithm for regularized empirical risk minimization problems that can be formulated as saddle point problems. Unlike existing coordinate methods, the proposed method randomly samples both primal and dual coordinates to update the solution, which is a desirable property when applied to data with both a high dimension and a large s...

Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization

We consider a generic convex optimization problem associated with regularized empirical risk minimization of linear predictors. The problem structure allows us to reformulate it as a convex-concave saddle point problem. We propose a stochastic primal-dual coordinate method, which alternates between maximizing over one (or more) randomly chosen dual variable and minimizing over the primal variab...

Dual Free SDCA for Empirical Risk Minimization with Adaptive Probabilities

In this paper we develop dual-free SDCA with adaptive probabilities for regularized empirical risk minimization. This extends recent work of Shai Shalev-Shwartz [SDCA without Duality, arXiv:1502.06177] to allow non-uniform selection of the "dual" coordinate in SDCA. Moreover, the probabilities can change over time, which makes the method more efficient than uniform selection. Our work focuses on generating adapti...
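
As a concrete, deliberately simplified illustration of the kind of update described above, here is a minimal dual-free SDCA-style loop with non-uniform sampling for a ridge-regression objective. This is a sketch under stated assumptions, not the paper's algorithm: the adaptive-probability rule is not reproduced (a per-epoch heuristic proportional to the current dual-residual norms stands in for it), the step scaling 1/(n·p_i) is an assumed importance-sampling correction, and the name dual_free_sdca and the parameters eta and lam are hypothetical.

    # Sketch of a dual-free SDCA-style update with non-uniform sampling.
    # Problem: min_w (1/n) * sum_i (1/2)*(x_i @ w - y_i)^2 + (lam/2)*||w||^2.
    # The paper's adaptive rule is NOT reproduced; probabilities are refreshed
    # once per epoch, proportional to current dual-residual norms (assumption).
    import numpy as np

    def dual_free_sdca(X, y, lam=0.1, eta=0.01, epochs=20, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        alpha = np.zeros((n, d))            # pseudo-dual vectors, one per example
        w = alpha.sum(axis=0) / (lam * n)   # invariant: w = sum_i alpha_i / (lam*n)

        def grad_i(i, w):
            # Gradient of the i-th loss (1/2)*(x_i @ w - y_i)^2 with respect to w.
            return (X[i] @ w - y[i]) * X[i]

        for _ in range(epochs):
            # Residual-based sampling heuristic (an assumption, not the paper's rule).
            resid = np.array([np.linalg.norm(grad_i(i, w) + alpha[i]) for i in range(n)])
            p = (resid + 1e-12) / (resid + 1e-12).sum()
            for _ in range(n):
                i = rng.choice(n, p=p)
                v = grad_i(i, w) + alpha[i]     # dual residual for example i
                step = eta / (n * p[i])         # importance-weighted step size
                alpha[i] -= step * lam * n * v  # pseudo-dual update
                w -= step * v                   # preserves w = sum(alpha)/(lam*n)
        return w

For linear models the pseudo-dual vectors stay parallel to their data points, so each alpha_i can in practice be stored as a single scalar; the dense (n, d) array above is kept only to keep the sketch short.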

Primal Method for ERM with Flexible Mini-batching Schemes and Non-convex Losses

In this work we develop a new algorithm for regularized empirical risk minimization. Our method extends recent techniques of Shalev-Shwartz [02/2015], which enable a dual-free analysis of SDCA, to arbitrary mini-batching schemes. Moreover, our method is able to better utilize the information in the data defining the ERM problem. For convex loss functions, our complexity results match those of Q...

Exploiting Strong Convexity from Data with Primal-Dual First-Order Algorithms

We consider empirical risk minimization of linear predictors with convex loss functions. Such problems can be reformulated as convex-concave saddle point problems and are thus well suited to primal-dual first-order algorithms. However, primal-dual algorithms often require explicit strongly convex regularization in order to obtain fast linear convergence, and the required dual proximal mappi...

Journal title:

Volume:   Issue:

Pages:

Publication date: 2016